Mesh Adaptive Direct Search Algorithms for Constrained Optimization

Authors

  • Charles Audet
  • John E. Dennis
Abstract

This paper introduces the Mesh Adaptive Direct Search (MADS) class of algorithms for nonlinear optimization. MADS extends the Generalized Pattern Search (GPS) class by allowing local exploration, called polling, in a dense set of directions in the space of optimization variables. This means that, under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity at infeasible points and treating the problem as unconstrained. The main GPS convergence result identifies limit points where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many constraints for GPS. The MADS class of algorithms extends this result; the set of refining directions may even be dense in R^n, although we give an example where it is not. We present an implementable instance of MADS, illustrate and compare it with GPS on some test problems, and illustrate the limitation of our results with examples.
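The extreme barrier idea described in the abstract is simple to sketch in code. The Python snippet below is a minimal illustrative sketch, not the MADS instance from the paper: the names extreme_barrier, poll_step, and simple_direct_search are hypothetical, the poll uses a single random unit direction and its negative rather than the asymptotically dense direction sets MADS requires, and the step-size update is a crude expand-on-success / shrink-on-failure rule. It also assumes the starting point is feasible, since the barrier assigns infeasible points the value +infinity.

import numpy as np

def extreme_barrier(f, feasible):
    # Extreme barrier: infeasible points receive the value +infinity, so the
    # constrained problem can be handled as if it were unconstrained.
    def f_barrier(x):
        return f(x) if feasible(x) else np.inf
    return f_barrier

def poll_step(f_barrier, x, delta, directions):
    # Evaluate the poll points x + delta * d; return an improved point if one is found.
    fx = f_barrier(x)
    for d in directions:
        trial = x + delta * d
        if f_barrier(trial) < fx:
            return trial, True
    return x, False

def simple_direct_search(f, feasible, x0, delta=1.0, iters=200, seed=0):
    # Toy direct-search loop (hypothetical, not the paper's MADS): poll along one
    # random unit direction and its negative, keep successful trial points, and
    # refine the step size after failed polls.
    rng = np.random.default_rng(seed)
    fb = extreme_barrier(f, feasible)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = rng.normal(size=x.size)
        d /= np.linalg.norm(d)
        x, success = poll_step(fb, x, delta, [d, -d])
        delta = 2.0 * delta if success else 0.5 * delta
    return x

# Hypothetical usage: minimize a quadratic inside a ball, starting from a feasible point.
x_best = simple_direct_search(
    f=lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2,
    feasible=lambda x: float(np.dot(x, x)) <= 4.0,
    x0=[0.0, 0.0],
)
print(x_best)

The key design point the sketch tries to convey is that the constraint handling lives entirely in the wrapped objective: the search loop never sees the constraints, which is exactly what allows the constrained problem to be analyzed with unconstrained machinery.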

Related resources

Convergence of Mesh Adaptive Direct Search to Second-Order Stationary Points

A previous analysis of second-order behavior of generalized pattern search algorithms for unconstrained and linearly constrained minimization is extended to the more general class of mesh adaptive direct search (MADS) algorithms for general constrained optimization. Because of the ability of MADS to generate an asymptotically dense set of search directions, we are able to establish reasonable c...

Second-Order Convergence of Mesh-Adaptive Direct Search

Abstract: A previous analysis of second-order behavior of pattern search algorithms for unconstrained and linearly constrained minimization is extended to the more general class of mesh adaptive direct search (MADS) algorithms for general constrained optimization. Because of the ability of MADS to generate an asymptotically dense set of search directions, we are able to establish reasonable con...

Erratum: Mesh Adaptive Direct Search Algorithms for Constrained Optimization

In [SIAM J. Optim., 17 (2006), pp. 188-217] Audet and Dennis proposed the class of mesh adaptive direct search algorithms (MADS) for minimization of a nonsmooth function under general nonsmooth constraints. The notation used in the paper evolved since the preliminary versions and, unfortunately, even though the statement of Proposition 4.2 is correct, its proof is not compatible with the final notation. ...

Novel Applications of Optimization to Molecule Design

We present results from the application of two conformational search methods, genetic algorithms (GA) and parallel direct search methods, for finding all of the low-energy conformations of a molecule that are within a certain energy of the global minimum. Genetic algorithms are in a class of biologically motivated optimization methods that evolve a population of individuals, where individuals who are...

Use of quadratic models with mesh-adaptive direct search for constrained black box optimization

We consider derivative-free optimization, and in particular black box optimization, where the functions to minimize and the functions representing the constraints are given by black boxes without derivatives. Two fundamental families of methods are available: Model-based methods and directional direct search algorithms. This work exploits the flexibility of the second type of method in order to...

Journal:
  • SIAM Journal on Optimization

Volume 17, Issue -

Pages 188-217

Publication year: 2006